Improving Gibbs Sampler Scan Quality with DoGS

Authors

  • Ioannis Mitliagkas
  • Lester W. Mackey
Abstract

The pairwise influence matrix of Dobrushin has long been used as an analytical tool to bound the rate of convergence of Gibbs sampling. In this work, we use Dobrushin influence as the basis of a practical tool to certify and efficiently improve the quality of a Gibbs sampler. Our Dobrushin-optimized Gibbs samplers (DoGS) offer customized variable selection orders for a given sampling budget and variable subset of interest, explicit bounds on total variation distance to stationarity, and certifiable improvements over the standard systematic and uniform random scan Gibbs samplers. In our experiments with joint image segmentation and object recognition, Markov chain Monte Carlo maximum likelihood estimation, and Ising model inference, DoGS consistently deliver higher-quality inferences with significantly smaller sampling budgets than standard Gibbs samplers.
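As a concrete, deliberately simplified illustration of the ideas in the abstract, the sketch below is not the authors' DoGS implementation: the 3x3 grid, the uniform coupling strength, the tanh-based influence bound, and the "high-influence-first" scan order are all my own illustrative assumptions. It builds a small Ising model, forms the standard tanh upper bound on the Dobrushin influence matrix for this parameterization, checks the classical Dobrushin condition, and runs Gibbs sweeps under a user-specified scan order.

```python
# A minimal sketch (not the authors' DoGS code): Gibbs sampling on a small
# Ising model with an arbitrary, user-chosen scan order, plus the standard
# tanh upper bound on the Dobrushin influence matrix for this parameterization.
import numpy as np

rng = np.random.default_rng(0)

# Pairwise Ising model p(x) ∝ exp(sum_{i<j} theta[i, j] x_i x_j), x_i in {-1, +1},
# here a 3x3 grid with a uniform coupling strength (an illustrative choice).
n_side, coupling = 3, 0.2
n = n_side * n_side
theta = np.zeros((n, n))
for r in range(n_side):
    for c in range(n_side):
        i = r * n_side + c
        for dr, dc in ((0, 1), (1, 0)):
            rr, cc = r + dr, c + dc
            if rr < n_side and cc < n_side:
                j = rr * n_side + cc
                theta[i, j] = theta[j, i] = coupling

# For this Ising parameterization, the influence of variable j on variable i
# is at most tanh(|theta[i, j]|); the classical Dobrushin condition asks that
# the maximum row sum of this matrix be below 1.
C = np.tanh(np.abs(theta))
print("max row sum of influence bound (want < 1):", C.sum(axis=1).max())

def gibbs(theta, scan, n_sweeps=100, rng=rng):
    """Run Gibbs sweeps, visiting variables in the order given by `scan`."""
    x = rng.choice([-1, 1], size=theta.shape[0])
    for _ in range(n_sweeps):
        for i in scan:
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * theta[i] @ x))  # P(x_i = +1 | rest)
            x[i] = 1 if rng.random() < p_plus else -1
    return x

systematic = list(range(n))  # standard systematic scan
# An arbitrary illustrative heuristic, NOT the DoGS-optimized order:
custom = sorted(range(n), key=lambda i: -C[i].sum())
print("final state, systematic scan:       ", gibbs(theta, systematic))
print("final state, influence-ordered scan:", gibbs(theta, custom))
```

The point of the sketch is only the mechanics: a scan order is just a sequence handed to the sampler, and the influence matrix is a cheap, model-derived quantity that can be inspected before any sampling is done.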


Similar resources

Subsampling the Gibbs Sampler: Variance Reduction

Subsampling the output of a Gibbs sampler in a non-systematic fashion can improve the efficiency of marginal estimators if the subsampling strategy is tied to the actual updates made. We illustrate this point by example, approximation, and asymptotics. The results hold both for random scan and fixed scan Gibbs samplers.
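One hedged reading of "tied to the actual updates made" (my interpretation for illustration, not the paper's construction): record a draw for the marginal of interest only on iterations in which the random-scan sampler actually refreshed that coordinate, rather than thinning at a fixed interval. The bivariate normal target and the correlation value below are assumptions.

```python
# A sketch of update-tied subsampling for a random-scan Gibbs sampler on a
# bivariate standard normal with correlation rho (my toy target, not the paper's).
import numpy as np

rng = np.random.default_rng(1)
rho, n_steps = 0.9, 20000

x = np.zeros(2)
every_step, when_updated = [], []
for _ in range(n_steps):
    i = rng.integers(2)                      # random scan: pick a coordinate uniformly
    x[i] = rho * x[1 - i] + np.sqrt(1 - rho**2) * rng.standard_normal()
    every_step.append(x[0])                  # estimator using every iteration
    if i == 0:
        when_updated.append(x[0])            # estimator using only fresh updates of x[0]

print("mean of x[0], all iterations:   ", np.mean(every_step))
print("mean of x[0], update-tied draws:", np.mean(when_updated))
```

Both estimators target the same marginal mean; which averaging scheme has lower variance is exactly the kind of question the paper studies.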


Parallel Gibbs Sampling: From Colored Fields to Thin Junction Trees

We explore the task of constructing a parallel Gibbs sampler, to both improve mixing and the exploration of high likelihood states. Recent work in parallel Gibbs sampling has focused on update schedules which do not guarantee convergence to the intended stationary distribution. In this work, we propose two methods to construct parallel Gibbs samplers guaranteed to draw from the targeted distrib...
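The graph-coloring idea behind "colored fields" can be sketched generically (my own assumptions: a toroidal Ising grid and a checkerboard 2-coloring, not the paper's exact algorithm): variables of the same color share no edges, so they are conditionally independent given the rest and can be updated simultaneously.

```python
# A sketch of chromatic (graph-coloring) Gibbs sampling on a 2D Ising grid:
# the checkerboard 2-coloring makes same-color spins conditionally independent,
# so each color class can be updated in parallel (here, vectorized with NumPy).
import numpy as np

rng = np.random.default_rng(2)
n_side, coupling, n_sweeps = 16, 0.3, 100   # even n_side keeps the coloring proper on a torus
x = rng.choice([-1, 1], size=(n_side, n_side))

rows, cols = np.indices((n_side, n_side))
colors = (rows + cols) % 2                   # checkerboard 2-coloring of the grid

for _ in range(n_sweeps):
    for color in (0, 1):
        # Sum of the four neighbors (periodic boundary: the grid is a torus for simplicity).
        nbr = (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
               np.roll(x, 1, 1) + np.roll(x, -1, 1))
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * coupling * nbr))  # P(x_ij = +1 | neighbors)
        mask = colors == color
        x[mask] = np.where(rng.random(mask.sum()) < p_plus[mask], 1, -1)

print("final magnetization:", x.mean())
```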


Implementing Random Scan Gibbs Samplers I

The Gibbs sampler, being a popular routine amongst Markov chain Monte Carlo sampling methodologies, has revolutionized the application of Monte Carlo methods in statistical computing practice. The performance of the Gibbs sampler relies heavily on the choice of sweep strategy, that is, the means by which the components or blocks of the random vector X of interest are visited and updated. We dev...
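To make "sweep strategy" concrete, the sketch below runs a random-scan Gibbs sampler whose coordinate-selection probabilities are non-uniform; the Gaussian target, its precision matrix, and the selection weights are illustrative assumptions, not a strategy recommended by the paper.

```python
# A sketch of a random-scan Gibbs sampler with non-uniform coordinate-selection
# probabilities (the weights below are illustrative only).
import numpy as np

rng = np.random.default_rng(3)

# Zero-mean Gaussian target specified by its precision matrix Q.
Q = np.array([[2.0, 0.8, 0.0],
              [0.8, 2.0, 0.8],
              [0.0, 0.8, 2.0]])
weights = np.array([0.2, 0.6, 0.2])          # visit the "middle" coordinate more often
weights = weights / weights.sum()

x = np.zeros(3)
samples = []
for _ in range(20000):
    i = rng.choice(3, p=weights)             # non-uniform random scan
    cond_mean = -(Q[i] @ x - Q[i, i] * x[i]) / Q[i, i]
    x[i] = cond_mean + rng.standard_normal() / np.sqrt(Q[i, i])
    samples.append(x.copy())

print("empirical covariance:\n", np.cov(np.array(samples).T))
print("target covariance:\n", np.linalg.inv(Q))
```

Any strictly positive selection weights leave the target distribution invariant; the sweep-strategy question is how the choice of weights (or of a deterministic visitation order) affects efficiency.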


Comment: On Random Scan Gibbs Samplers

We congratulate the authors on a review of convergence rates for Gibbs sampling routines. Their combined work on studying convergence rates via orthogonal polynomials in the present paper under discussion (which we will denote as DKSC from here onward), via coupling in Diaconis, Khare and Saloff-Coste (2006), and for multivariate samplers in Khare and Zhou (2008), enhances the toolbox of theoret...


Surprising Convergence Properties of Some Simple Gibbs Samplers Under Various Scans

We examine the convergence properties of some simple Gibbs sampler examples under various scans. We find some surprising results, including Gibbs samplers where deterministic-scan is much more efficient than random-scan, and other samplers where the opposite is true. We also present an example where the convergence takes precisely the same time with any fixed deterministic scan, but modifying t...
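A toy comparison of my own (not one of the paper's examples) makes the deterministic-versus-random-scan question measurable: both chains below spend two coordinate updates per recorded sweep on a correlated bivariate Gaussian, and we compare lag-1 autocorrelations of the recorded coordinate.

```python
# Systematic scan vs. uniform random scan on a correlated bivariate Gaussian
# (an illustrative experiment, not from the paper): each recorded "sweep"
# consists of two coordinate updates under either scheme.
import numpy as np

rng = np.random.default_rng(4)
rho, n_sweeps = 0.95, 50000

def run(scan_fn):
    x, trace = np.zeros(2), []
    for _ in range(n_sweeps):
        for i in scan_fn():
            x[i] = rho * x[1 - i] + np.sqrt(1 - rho**2) * rng.standard_normal()
        trace.append(x[0])
    return np.asarray(trace)

systematic = run(lambda: (0, 1))                    # fixed order every sweep
random_scan = run(lambda: rng.integers(2, size=2))  # two uniformly chosen updates

def lag1_autocorr(t):
    t = t - t.mean()
    return (t[:-1] * t[1:]).sum() / (t * t).sum()

print("lag-1 autocorrelation, systematic scan:", lag1_autocorr(systematic))
print("lag-1 autocorrelation, random scan:    ", lag1_autocorr(random_scan))
```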




Publication date: 2017